Linearly convergent gradient-free methods for minimization of parabolic approximation
Authors
Abstract
Finding the global minimum of nonconvex functions is one of the key and most challenging problems in modern optimization. In this work, we consider particular classes of problems that have a clearly pronounced global minimum.
Similar resources
A Linearly Convergent Dual-Based Gradient Projection Algorithm for Quadratically Constrained Convex Minimization
This paper presents a new dual formulation for quadratically constrained convex programs (QCCP). The special structure of the derived dual problem allows the gradient projection algorithm to be applied, yielding a simple explicit method involving only elementary vector-matrix operations that is proven to converge at a linear rate.
Linearly implicit methods for nonlinear parabolic equations
We construct and analyze combinations of rational implicit and explicit multistep methods for nonlinear parabolic equations. The resulting schemes are linearly implicit and include as particular cases implicit-explicit multistep schemes as well as the combination of implicit Runge-Kutta schemes and extrapolation. An optimal condition for the stability constant is derived under which the schemes...
Pattern Search Methods for Linearly Constrained Minimization
We extend pattern search methods to linearly constrained minimization. We develop a general class of feasible point pattern search algorithms and prove global convergence to a Karush-Kuhn-Tucker point. As in the case of unconstrained minimization, pattern search methods for linearly constrained problems accomplish this without explicit recourse to the gradient or the directional derivative of th...
Spectral gradient methods for linearly constrained optimization
Linearly constrained optimization problems with simple bounds are considered in the present work. First, a preconditioned spectral gradient method is defined for the case in which no simple bounds are present. This algorithm can be viewed as a quasi-Newton method in which the approximate Hessians satisfy a weak secant equation. The spectral choice of steplength is embedded into the Hessian appro...
Convergent Subgradient Methods for Nonsmooth Convex Minimization
In this paper, we develop new subgradient methods for solving nonsmooth convex optimization problems. These methods are the first for which the whole sequence of test points is endowed with worst-case performance guarantees. The new methods are derived from a relaxed estimating-sequences condition, which allows reconstruction of the approximate primal-dual optimal solutions. Our metho...
Journal
Journal title: Kompʹûternye issledovaniâ i modelirovanie
Year: 2022
ISSN: 2076-7633, 2077-6853
DOI: https://doi.org/10.20537/2076-7633-2022-14-2-239-255